
    Mid to Late Season Weed Detection in Soybean Production Fields Using Unmanned Aerial Vehicle and Machine Learning

    Mid- to late-season weeds are those that escape early-season herbicide applications or emerge late in the season. They might not affect the current crop yield, but if left uncontrolled they produce a large number of seeds that cause problems in subsequent years. In this study, high-resolution aerial imagery of mid-season weeds in soybean fields was captured using an unmanned aerial vehicle (UAV), and the performance of two automated weed detection approaches, patch-based classification and object detection, was studied for site-specific weed management. For the patch-based classification approach, several conventional machine learning models trained on Haralick texture features were compared with a MobileNet v2-based convolutional neural network (CNN) for classification performance. The CNN model had the best classification performance on individual patches. Two image slicing approaches, patches with and without overlap, were tested; slicing with overlap improved weed detection but increased inference time. For the object detection approach, two models with different network architectures, Faster RCNN and SSD, were evaluated and compared. Faster RCNN had better overall weed detection performance than SSD with similar inference time, and it also had better detection performance and shorter inference time than the patch-based CNN with overlapping slicing. The influence of spatial resolution on weed detection accuracy was investigated by simulating UAV imagery captured at different altitudes; Faster RCNN achieved similar performance at a lower spatial resolution. The inference time of Faster RCNN was also evaluated on a regular laptop.
The results show the potential of on-farm, near real-time weed detection in soybean production fields by capturing UAV imagery with less overlap and processing it with a pre-trained deep learning model, such as Faster RCNN, on regular laptops and mobile devices. Advisor: Yeyin Sh
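The patch-based pipeline above hinges on how the UAV image is sliced before classification. As a minimal sketch, not the study's code (the patch size, stride, and use of NumPy are assumptions), overlapping versus non-overlapping slicing can be implemented as:

```python
import numpy as np

def slice_patches(image, patch_size, stride):
    """Slice a 2-D image array into square patches.

    stride == patch_size gives non-overlapping patches; a smaller
    stride gives overlapping patches, which helps detect weeds that
    straddle patch boundaries but yields more patches to classify
    (longer inference time).
    """
    h, w = image.shape[:2]
    patches = []
    for y in range(0, h - patch_size + 1, stride):
        for x in range(0, w - patch_size + 1, stride):
            patches.append(image[y:y + patch_size, x:x + patch_size])
    return patches

# A 256x256 image with 128-pixel patches:
img = np.zeros((256, 256), dtype=np.uint8)
no_overlap = slice_patches(img, 128, 128)  # 2 x 2 = 4 patches
overlap = slice_patches(img, 128, 64)      # 3 x 3 = 9 patches
print(len(no_overlap), len(overlap))
```

The patch count, and hence the CNN inference cost, grows quadratically as the stride shrinks, which is why the abstract reports overlap improving detection at the price of inference time.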

    Elucidating Sorghum Biomass, Nitrogen and Chlorophyll Contents With Spectral and Morphological Traits Derived From Unmanned Aircraft System

    Unmanned aircraft systems (UAS) provide an efficient way to phenotype crop morphology with spectral traits such as plant height, canopy cover and various vegetation indices (VIs), providing information to elucidate genotypic responses to the environment. In this study, we investigated the potential use of UAS-derived traits to elucidate biomass, nitrogen and chlorophyll content in sorghum under nitrogen stress treatments. A nitrogen stress trial located in Nebraska, USA, contained 24 different sorghum lines, 2 nitrogen treatments and 8 replications, for a total of 384 plots. Morphological and spectral traits including plant height, canopy cover and various VIs were derived from UAS flights with a true-color RGB camera and a 5-band multispectral camera at early, mid and late growth stages across the 2017 sorghum growing season. Simple and multiple regression models were investigated for estimating sorghum biomass, nitrogen and chlorophyll content from the derived morphological and spectral traits along with manual ground-truth measurements. Results showed that the UAS-derived plant height was strongly correlated with manually measured plant height (r = 0.85), and the UAS-derived biomass estimates using plant height, canopy cover and VIs had strong exponential correlations with the sampled biomass of fresh stalks and leaves (maximum r = 0.85) and of dry stalks and leaves (maximum r = 0.88). The UAS-derived VIs were moderately correlated with the laboratory-measured leaf nitrogen content (r = 0.52) and the measured leaf chlorophyll content (r = 0.69) in each plot. The methods developed in this study will facilitate genetic improvement and agronomic studies that require assessment of stress responses in large-scale field trials.
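The core computations behind these results are a Pearson correlation between a UAS-derived trait and its ground-truth measurement, and an exponential regression fitted on the log of biomass. A minimal sketch with synthetic stand-in values (the numbers below are illustrative, not the study's data):

```python
import numpy as np

# Synthetic stand-ins: UAS-derived plant height (m) vs sampled biomass (kg/plot)
height = np.array([0.5, 0.8, 1.1, 1.4, 1.8, 2.1])
biomass = np.array([0.9, 1.6, 2.6, 4.1, 7.0, 10.5])

# Pearson correlation between the derived trait and the ground truth
r = np.corrcoef(height, biomass)[0, 1]

# Exponential model biomass = a * exp(b * height), fitted as a
# linear regression on log(biomass)
b, log_a = np.polyfit(height, np.log(biomass), 1)
a = np.exp(log_a)
print(round(r, 2), round(a, 3), round(b, 3))
```

Fitting on the log scale is what makes an "exponential correlation" reducible to ordinary least squares, which is presumably why the abstract reports exponential rather than linear fits for biomass.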

    Principal variable selection to explain grain yield variation in winter wheat from features extracted from UAV imagery

    Background: Automated phenotyping technologies are continually advancing the breeding process. However, collecting various secondary traits throughout the growing season and processing massive amounts of data still take great effort and time. Selecting a minimum number of secondary traits that have the maximum predictive power has the potential to reduce phenotyping effort. The objective of this study was to select the principal features extracted from UAV imagery, and the critical growth stages, that contributed the most to explaining winter wheat grain yield. Five dates of multispectral images and seven dates of RGB images were collected by a UAV system during the spring growing season in 2018. Two classes of features (variables), totaling 172 variables, were extracted for each plot from the vegetation index and plant height maps, including pixel statistics and dynamic growth rates. A parametric algorithm, LASSO regression (the least absolute shrinkage and selection operator), and a non-parametric algorithm, random forest, were applied for variable selection. The regression coefficients estimated by LASSO and the permutation importance scores provided by random forest were used to determine the ten most important variables influencing grain yield for each algorithm. Results: Both selection algorithms assigned the highest importance scores to variables related to plant height around the grain filling stage. Some vegetation index-related variables were also selected, mainly at early to mid growth stages and during senescence. Compared with yield prediction using all 172 variables derived from measured phenotypes, prediction using the selected variables performed comparably or even better. We also noticed that the prediction accuracy on the adapted NE lines (r = 0.58–0.81) was higher than on the other lines with different genetic backgrounds (r = 0.21–0.59) included in this study.
Conclusions: With the ultra-high-resolution plot imagery obtained by UAS-based phenotyping, we can now derive more features, such as the within-plot variation of plant height or vegetation indices rather than just a plot average, that are potentially very useful for breeding. However, too many features or variables can be derived in this way. The promising results from this study suggest that the selected subset of variables can achieve grain yield prediction accuracy comparable to the full set, while possibly allowing a better allocation of effort and resources for phenotypic data collection and processing.
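The two selection routes named in the abstract, LASSO coefficients and random forest permutation importance, can be sketched with scikit-learn on synthetic stand-in data (the 20-variable design, coefficients, and noise level below are illustrative assumptions, not the study's 172-variable dataset):

```python
import numpy as np
from sklearn.linear_model import Lasso
from sklearn.ensemble import RandomForestRegressor
from sklearn.inspection import permutation_importance

rng = np.random.default_rng(0)
# Synthetic stand-in: 100 plots, 20 candidate variables; only columns
# 0 and 1 (think: plant-height features near grain filling) drive "yield".
X = rng.normal(size=(100, 20))
y = 3.0 * X[:, 0] + 2.0 * X[:, 1] + rng.normal(scale=0.1, size=100)

# Parametric route: LASSO shrinks unimportant coefficients toward zero,
# so ranking |coef| selects the principal variables.
lasso = Lasso(alpha=0.1).fit(X, y)
lasso_top = np.argsort(np.abs(lasso.coef_))[::-1][:2]

# Non-parametric route: random forest ranked by permutation importance
# (how much shuffling a variable degrades the model's score).
rf = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)
perm = permutation_importance(rf, X, y, n_repeats=5, random_state=0)
rf_top = np.argsort(perm.importances_mean)[::-1][:2]

print(sorted(lasso_top.tolist()), sorted(rf_top.tolist()))
```

Running both selectors and intersecting their top-ranked variables, as the study does with its top ten per algorithm, guards against the biases of either method alone.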

    Authors' reply

    Mid- to late-season weeds that escape routine early-season weed management threaten agricultural production by producing a large number of seeds for several future growing seasons. Rapid and accurate detection of weed patches in the field is the first step of site-specific weed management. In this study, object detection-based convolutional neural network models were trained and evaluated on low-altitude unmanned aerial vehicle (UAV) imagery for mid- to late-season weed detection in soybean fields. Two object detection models, Faster RCNN and the Single Shot Detector (SSD), were evaluated and compared in terms of weed detection performance, using mean Intersection over Union (IoU), and inference speed. It was found that the Faster RCNN model with 200 box proposals had weed detection performance similar to that of the SSD model in terms of precision, recall, F1 score and IoU, as well as a similar inference time. The precision, recall, F1 score and IoU were 0.65, 0.68, 0.66 and 0.85 for Faster RCNN with 200 proposals, and 0.66, 0.68, 0.67 and 0.84 for SSD, respectively. However, the optimal confidence threshold of the SSD model was much lower than that of the Faster RCNN model, which indicated that SSD might generalize less well than Faster RCNN for mid- to late-season weed detection in soybean fields using UAV imagery. The object detection models were also compared with a patch-based CNN model. The Faster RCNN model yielded better weed detection performance than the patch-based CNN both with and without overlap. The inference time of Faster RCNN was similar to that of the patch-based CNN without overlap, but significantly less than that of the patch-based CNN with overlap. Hence, Faster RCNN was the best model in terms of weed detection performance and inference time among the models compared in this study.
This work is important in understanding the potential of, and identifying the algorithms for, on-farm, near real-time weed detection and management.
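The metrics reported above are standard and easy to state precisely. A minimal sketch (not the study's evaluation code; the box format and counts below are illustrative assumptions) of IoU for axis-aligned boxes and of precision/recall/F1 from detection counts:

```python
def iou(box_a, box_b):
    """Intersection over Union of two boxes given as (x1, y1, x2, y2)."""
    x1 = max(box_a[0], box_b[0])
    y1 = max(box_a[1], box_b[1])
    x2 = min(box_a[2], box_b[2])
    y2 = min(box_a[3], box_b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter)

def prf1(tp, fp, fn):
    """Precision, recall and F1 from true/false positive and false negative counts."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return precision, recall, f1

# Two partially overlapping 10x10 boxes: intersection 25, union 175
print(iou((0, 0, 10, 10), (5, 5, 15, 15)))  # 25 / 175 ≈ 0.143

# Illustrative counts chosen to roughly reproduce the reported
# Faster RCNN figures (precision 0.65, recall 0.68, F1 0.66)
print(prf1(68, 37, 32))
```

Precision and recall move in opposite directions as the confidence threshold changes, which is why the abstract treats the optimal threshold itself as evidence about each model's generalization.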